RNN-Test: Towards Adversarial Testing for Recurrent Neural Network Systems

Authors

Abstract

While massive efforts have been invested in the adversarial testing of convolutional neural networks (CNN), testing for recurrent neural networks (RNN) is still limited and leaves threats in vast sequential application domains. In this paper, we propose an adversarial testing framework, RNN-Test, for RNN systems, focusing on sequence-to-sequence (seq2seq) tasks with widespread deployments rather than classification tasks only. First, we design a novel search methodology customized for RNN models by maximizing the inconsistency of RNN states against their inner dependencies to produce adversarial inputs. Next, we introduce two state-based coverage metrics, defined according to the distinctive structure of RNNs, to exercise more system behaviors. Finally, RNN-Test solves a joint optimization problem to maximize both state inconsistency and state coverage, and crafts adversarial inputs for various tasks with different kinds of inputs. For evaluation, we apply RNN-Test to four RNN models with common structures. On the tested models, our approach is demonstrated to be competitive in generating adversarial inputs, outperforming FGSM-based and DLFuzz-based methods by reducing model performance sharply with 2.78% to 37.94% higher success (or generation) rates. RNN-Test also achieves 52.65% to 66.45% higher adversary rates than testRNN on the MNIST LSTM model, as well as 53.76% to 58.02% higher perplexity and a 16% higher generation rate than DeepStellar on the PTB language model. Compared with traditional neuron coverage, the proposed coverage guidance excels by 4.17% to 97.22%.
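The search methodology and coverage guidance summarized above can be illustrated with a small, hypothetical sketch. The snippet below is only a rough interpretation of the abstract in PyTorch: the toy model, the state_inconsistency and state_coverage proxies, and the FGSM-style signed update are illustrative assumptions, not the authors' actual RNN-Test implementation.

# Minimal, hypothetical sketch of the RNN-Test idea from the abstract:
# perturb the input so as to (1) maximize the inconsistency of the RNN
# hidden states and (2) maximize a state-based coverage objective.
# All helper names and definitions here are assumptions for illustration.
import torch
import torch.nn as nn

class TinySeqModel(nn.Module):
    """A stand-in seq2seq-style RNN whose hidden states we can inspect."""
    def __init__(self, vocab=100, emb=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.rnn = nn.LSTM(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, x_emb):
        states, _ = self.rnn(x_emb)          # (batch, time, hidden)
        return self.out(states), states

def state_inconsistency(states):
    # Assumed proxy: reward large step-to-step changes of the hidden states,
    # i.e. "inconsistency against their inner dependencies".
    return (states[:, 1:] - states[:, :-1]).pow(2).mean()

def state_coverage(states, threshold=0.5):
    # Assumed proxy for a state-based coverage metric: a soft count of hidden
    # units whose activation magnitude exceeds a threshold.
    return torch.sigmoid(states.abs() - threshold).mean()

def craft_adversarial_embedding(model, x_ids, steps=10, lr=0.05, alpha=1.0, beta=1.0):
    """Gradient ascent on the joint objective over the input embeddings."""
    x_emb = model.embed(x_ids).detach().clone().requires_grad_(True)
    for _ in range(steps):
        _, states = model(x_emb)
        objective = alpha * state_inconsistency(states) + beta * state_coverage(states)
        objective.backward()
        with torch.no_grad():
            x_emb += lr * x_emb.grad.sign()  # FGSM-style signed step
            x_emb.grad.zero_()
    return x_emb.detach()

if __name__ == "__main__":
    model = TinySeqModel()
    x_ids = torch.randint(0, 100, (1, 20))
    adv_emb = craft_adversarial_embedding(model, x_ids)
    print(adv_emb.shape)  # torch.Size([1, 20, 32])

In practice, the crafted embedding (or its nearest valid token sequence) would then be fed back to the model to check whether its predictions degrade, which is what the success and generation rates above measure.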


Related articles

C-RNN-GAN: Continuous recurrent neural networks with adversarial training

Generative adversarial networks have been proposed as a way of efficiently training deep generative neural networks. We propose a generative adversarial model that works on continuous sequential data, and apply it by training it on a collection of classical music. We conclude that it generates music that sounds better and better as the model is trained, report statistics on generated music, and...

Full text

Development of A New Recurrent Neural Network Toolbox (RNN-Tool)

In this report, we developed a new recurrent neural network toolbox, including the recurrent multilayer perceptron structure and its accompanying extended Kalman filter-based training algorithms: BPTT-GEKF and BPTT-DEKF. Besides, we also constructed programs for designing an echo state network with a single reservoir, together with the offline linear regression-based training algorithm. We name this t...

Full text

A Recurrent Neural Network Model for Solving Linear Semidefinite Programming

In this paper we solve a wide range of Semidefinite Programming (SDP) problems by using Recurrent Neural Networks (RNNs). SDP is an important numerical tool for analysis and synthesis in systems and control theory. First, we reformulate the problem as a linear programming problem; second, we reformulate it as a first-order system of ordinary differential equations. Then a recurrent neural network...

Full text

Towards Deep Neural Network Architectures Robust to Adversarial Examples

Recent work has shown deep neural networks (DNNs) to be highly susceptible to well-designed, small perturbations at the input layer, so-called adversarial examples. Taking images as an example, such distortions are often imperceptible, but can result in 100% misclassification for a state-of-the-art DNN. We study the structure of adversarial examples and explore network topology, pre-process...

Full text

E-RNN: Entangled Recurrent Neural Networks for Causal Prediction

We propose a novel architecture of recurrent neural networks (RNNs) for causal prediction, which we call Entangled RNN (E-RNN). To issue causal predictions, E-RNN can propagate the backward hidden states of a Bi-RNN through an additional forward hidden layer. Unlike a 2-layer RNN, all the hidden states of E-RNN depend on all the inputs seen so far. Furthermore, unlike a Bi-RNN, for causal predict...

Full text


Journal

Journal title: IEEE Transactions on Software Engineering

Year: 2022

ISSN: 0098-5589, 1939-3520, 2326-3881

DOI: https://doi.org/10.1109/tse.2021.3114353